Log-majorization related to Rényi divergences


Related articles

Rényi divergence and majorization

Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
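For concreteness, the order-α Rényi divergence between two discrete distributions is \(D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log\sum_i p_i^\alpha q_i^{1-\alpha}\), which recovers the Kullback-Leibler divergence as α → 1. A minimal sketch (illustrative code, not taken from the article above; function names are ours):

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) of order alpha (alpha > 0, alpha != 1)
    between discrete distributions given as sequences of probabilities."""
    if alpha <= 0 or alpha == 1:
        raise ValueError("alpha must be positive and different from 1")
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

def kl_divergence(p, q):
    """Kullback-Leibler divergence, the alpha -> 1 limit of D_alpha."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
# D_alpha is non-decreasing in alpha and tends to the KL divergence as alpha -> 1.
print(renyi_divergence(p, q, 0.5))
print(renyi_divergence(p, q, 0.999))  # close to kl_divergence(p, q)
print(kl_divergence(p, q))
```

The monotonicity in α and the KL limit illustrated here are among the standard properties such a survey reviews.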


A variational characterization of Rényi divergences

Atar, Chowdhary and Dupuis have recently exhibited a variational formula for exponential integrals of bounded measurable functions in terms of Rényi divergences. We develop a variational characterization of the Rényi divergences between two probability distributions on a measurable space in terms of relative entropies. When combined with the elementary variational formula for exponential integr...
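For reference, the classical "elementary variational formula for exponential integrals" alluded to above is the Donsker–Varadhan duality (stated here as background, not as the article's new result): for a bounded measurable function \(g\) and a probability measure \(\mu\),

```latex
\log \int e^{g}\, d\mu
  \;=\; \sup_{\nu}\Bigl\{ \int g \, d\nu \;-\; R(\nu \,\Vert\, \mu) \Bigr\},
```

where the supremum runs over probability measures \(\nu\) and \(R(\nu \Vert \mu)\) denotes relative entropy; the works described above develop analogous formulas with Rényi divergences in place of \(R\).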


Infinite-dimensional Log-Determinant divergences II: Alpha-Beta divergences

This work presents a parametrized family of divergences, namely Alpha-Beta Log-Determinant (Log-Det) divergences, between positive definite unitized trace class operators on a Hilbert space. This is a generalization of the Alpha-Beta Log-Determinant divergences between symmetric, positive definite matrices to the infinite-dimensional setting. The family of Alpha-Beta Log-Det divergences is highl...


Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences

In this paper, we review and extend a family of log-det divergences for symmetric positive definite (SPD) matrices and discuss their fundamental properties. We show how to generate from parameterized Alpha-Beta (AB) and Gamma log-det divergences many well-known divergences, for example, Stein's loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, the Logdet Zero (Bhattac...
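One concrete member of this family, the S-divergence (Jensen-Bregman LogDet divergence) named above, is \(S(A,B) = \log\det\frac{A+B}{2} - \frac{1}{2}\log\det(AB)\) for SPD matrices \(A, B\). A minimal NumPy sketch of this standard formula (illustrative, not the paper's code):

```python
import numpy as np

def jbld(A, B):
    """Jensen-Bregman LogDet (S-)divergence between SPD matrices A and B:
    log det((A + B) / 2) - 0.5 * (log det A + log det B).
    Uses slogdet for numerical stability; assumes A and B are SPD."""
    _, logdet_m = np.linalg.slogdet((A + B) / 2)
    _, logdet_a = np.linalg.slogdet(A)
    _, logdet_b = np.linalg.slogdet(B)
    return logdet_m - 0.5 * (logdet_a + logdet_b)

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.0], [0.0, 3.0]])
print(jbld(A, A))  # 0.0 for identical matrices
print(jbld(A, B))  # symmetric in its arguments: equals jbld(B, A)
```

Unlike Stein's loss, this divergence is symmetric in its arguments, one reason it is popular as a similarity measure on SPD matrices.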


Approximation and estimation of s-concave densities via Rényi divergences

In this paper, we study the approximation and estimation of s-concave densities via Rényi divergence. We first show that the approximation of a probability measure Q by an s-concave density exists and is unique via the procedure of minimizing a divergence functional proposed by [Ann. Statist. 38 (2010) 2998-3027] if and only if Q admits full-dimensional support and a first moment. We also show c...



Journal

Journal: Linear Algebra and its Applications

Year: 2019

ISSN: 0024-3795

DOI: 10.1016/j.laa.2018.11.004